Law and the Emerging Political Economy of Algorithmic Audits
Terzis, Petros, Veale, Michael, Gaumann, Noëlle
For almost a decade, scholarship in and beyond the ACM FAccT community has focused on novel methodologies for auditing the functioning of algorithmic systems. Over the years, this research agenda and technical project has matured into a regulatory mandate. Today, the Digital Services Act (DSA) and the Online Safety Act (OSA) establish the framework within which technology corporations and (traditional) auditors will develop the `practice' of algorithmic auditing, thereby presaging how this `ecosystem' will develop. In this paper, we systematically review the auditing provisions of the DSA and the OSA in light of observations from the emerging industry of algorithmic auditing. Who is likely to occupy this space? What political and ethical tensions are likely to arise? How are the mandates of `independent auditing' or `the evaluation of the societal context of an algorithmic function' likely to play out in practice? By sketching the emerging political economy of algorithmic auditing, we draw attention to strategies and cultures of traditional auditors that risk eroding important regulatory pillars of the DSA and the OSA. Importantly, we warn that ambitious research ideas and technical projects of/for algorithmic auditing may end up crushed by the standardising grip of traditional auditors and/or diluted within a complex web of (sub-)contractual arrangements, diverse portfolios, and tight timelines.
Bias in facial recognition isn't hard to discover, but it's hard to get rid of
Joy Buolamwini is a researcher at the MIT Media Lab who pioneered research into the bias built into artificial intelligence and facial recognition. And the way she came to this work is almost a little too on the nose. As a graduate student at MIT, she created a mirror that would project aspirational images onto her face, like a lion or tennis star Serena Williams. But the facial-recognition software she installed wouldn't work on her Black face until she literally put on a white mask. Buolamwini is featured in a documentary called "Coded Bias," airing tonight on PBS.
The Algorithmic Auditing Trap
This op-ed was written by Mona Sloane, a sociologist and senior research scientist at the NYU Center for Responsible A.I. and a fellow at the NYU Institute for Public Knowledge. Her work focuses on design and inequality in the context of algorithms and artificial intelligence. We have a new A.I. race on our hands: the race to define and steer what it means to audit algorithms. Governing bodies know that they must come up with solutions to the disproportionate harm algorithms can inflict. This technology has disproportionate impacts on racial minorities, the economically disadvantaged, womxn, and people with disabilities, with applications ranging from health care to welfare, hiring, and education.